entropy of information source

informacijos entropija [information entropy]; status: T; field: physics. Equivalents: English: entropy of information source; German: Informationsentropie, f; mittlerer Informationsgehalt, m; Russian: информационная энтропия, f; энтропия информации, f; French: entropie informationnelle, f; néguentropie, f

Fizikos terminų žodynas [Dictionary of Physics Terms: in Lithuanian, English, French, German, and Russian]. Vilnius: Mokslo ir enciklopedijų leidybos institutas, 2007.
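For context (an addition, not part of the dictionary entry itself): the quantity this term names is standardly defined, for a discrete source that emits symbol x with probability p(x), as

```latex
% Shannon entropy of a discrete information source, in bits per symbol
% (standard definition, supplied here for context).
H(X) = -\sum_{x} p(x) \log_2 p(x)
```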


Look at other dictionaries:

  • Information source (mathematics) — In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution. The uncertainty, or entropy rate, of an information source is defined as $H(\mathbf{X}) = \lim_{n\to\infty}$ … …   Wikipedia
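The snippet above is cut off mid-formula; the standard definition it begins (supplied here for completeness, not quoted from the source page) is the limiting conditional entropy

```latex
% Entropy rate of an information source as a limiting conditional entropy
% (standard definition; the dictionary snippet above is truncated).
H(\mathbf{X}) = \lim_{n\to\infty} H(X_n \mid X_0, X_1, \ldots, X_{n-1})
```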

  • Markov information source — In mathematics, a Markov information source, or simply, a Markov source, is an information source whose underlying dynamics are given by a stationary finite Markov chain. …   Wikipedia
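As an illustration (added here, not from the dictionary): a minimal Python sketch of such a source, where each emitted symbol depends only on the previous one through a fixed transition matrix. The alphabet and probabilities are illustrative assumptions.

```python
import random

# Transition probabilities of a stationary finite Markov chain.
# The alphabet {"a", "b"} and the probabilities are illustrative assumptions.
TRANSITIONS = {
    "a": {"a": 0.9, "b": 0.1},  # after "a", emit "a" with probability 0.9
    "b": {"a": 0.5, "b": 0.5},  # after "b", both symbols are equally likely
}

def markov_source(start="a", n=10, seed=0):
    """Yield n symbols from the Markov source defined by TRANSITIONS."""
    rng = random.Random(seed)
    state = start
    for _ in range(n):
        yield state
        symbols, probs = zip(*TRANSITIONS[state].items())
        state = rng.choices(symbols, weights=probs, k=1)[0]

print("".join(markov_source(n=20)))  # a run dominated by "a", per TRANSITIONS
```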

  • Information theory — Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental… …   Wikipedia

  • Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… …   Wikipedia
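To make the definition concrete, here is a small self-contained sketch (added for illustration, not part of any dictionary entry) that computes the Shannon entropy of the empirical symbol distribution of a string:

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy, in bits per symbol, of the empirical
    distribution of the symbols in `data`."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aabb"))            # 1.0 (two equiprobable symbols)
print(shannon_entropy("aaaa"))            # -0.0 (a certain outcome carries no information)
print(round(shannon_entropy("abcd"), 2))  # 2.0 (four equiprobable symbols)
```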

  • Entropy — This article is about entropy in thermodynamics. For entropy in information theory, see Entropy (information theory). For a comparison of entropy in information theory with entropy in thermodynamics, see Entropy in thermodynamics and information… …   Wikipedia

  • Entropy rate — The entropy rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of …   Wikipedia
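In standard notation, the limit the snippet refers to (supplied for completeness) is

```latex
% Entropy rate as the limiting per-symbol joint entropy of the process
% (standard form; for stationary processes it is equivalent to the
% conditional-entropy limit given above).
H(\mathbf{X}) = \lim_{n\to\infty} \frac{1}{n}\, H(X_1, X_2, \ldots, X_n)
```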

  • Entropy encoding — In information theory an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium. One of the main types of entropy coding creates and assigns a unique prefix-free code to each… …   Wikipedia
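As an illustration of one such scheme (a minimal Huffman-coding sketch, added here for context; Huffman coding is one classic entropy coder, not the only one):

```python
import heapq
from collections import Counter

def huffman_code(data):
    """Build a prefix-free (Huffman) code for the symbols in `data`.

    Returns a dict mapping each symbol to its bit string. Minimal
    sketch: assumes `data` contains at least two distinct symbols.
    """
    # Heap entries: (frequency, tiebreaker, {symbol: code-so-far}).
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        # Prepend one bit to distinguish the two merged subtrees.
        merged = {s: "0" + bits for s, bits in c1.items()}
        merged.update({s: "1" + bits for s, bits in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

code = huffman_code("abracadabra")
print(code)  # e.g. {'a': '0', 'c': '100', 'd': '101', 'b': '110', 'r': '111'}
print("".join(code[s] for s in "abracadabra"))  # the encoded bit string
```

Frequent symbols get shorter codewords, so the average code length approaches the source entropy, which is the sense in which entropy limits lossless compression.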

  • information theory — the mathematical theory concerned with the content, transmission, storage, and retrieval of information, usually in the form of messages or data, and esp. by means of computers. [1945–50] …   Universalium

  • Shannon's source coding theorem — In information theory, Shannon s source coding theorem (or noiseless coding theorem) establishes the limits to possible data compression, and the operational meaning of the Shannon entropy.The source coding theorem shows that (in the limit, as… …   Wikipedia

  • History of entropy — The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat… …   Wikipedia
